multi-layer perceptron

Terms from Artificial Intelligence: humans at the heart of algorithms

Page numbers refer to the draft copy at present; they will be replaced with the correct numbers when the final book is formatted. Chapter numbers are correct and will not change.

A multi-layer perceptron is an early feed-forward neural network architecture with three layers of units: input, hidden and output. The combination of differentiable sigmoid activation functions and backpropagation made it possible to learn the weights feeding the hidden layer, which single-layer perceptrons could not do, and so began the modern field of neural networks.

Used in Chap. 6: pages 83, 84, 86, 87, 90; Chap. 7: page 107

Figure: A multi-layer perceptron architecture.

Figure: A simple multi-layer perceptron to solve the XOR problem.
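As a minimal sketch of the idea in the definition above, the following Python/NumPy code trains a three-layer perceptron on the XOR problem with sigmoid activations and backpropagation. All names, the hidden-layer size and the hyperparameters are illustrative choices, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training data: inputs and target outputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Three layers: 2 input units -> 8 hidden units -> 1 output unit
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(20000):
    # forward pass through hidden and output layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass: squared-error gradients via the chain rule,
    # using the sigmoid derivative s * (1 - s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # gradient-descent updates for both weight layers
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel()))
```

After training, the rounded outputs should reproduce the XOR truth table, something a single-layer perceptron cannot represent because XOR is not linearly separable.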